Snake Oil

UMMC experts distinguish substantive efforts, junk science

Published on Tuesday, January 1, 2019

By: Gary Pettus

NOTE: This article originally appeared in the January 2019 issue of CONSULT, UMMC's monthly electronic newsletter. To have CONSULT, and more stories like this, delivered directly to your inbox, click here to subscribe.


Drinking orange juice delays the risk of dementia in men.

Eating dark chocolate improves your memory.

Oversleeping can lead to heart disease – or death.

Our nation’s scientists are busy these days, announcing findings, such as the ones described above, which often sound too good, or bad, to be true. But how do you know if they are?

Take the three aforementioned studies: They’ve been reported widely, yet come with some not-as-widely-reported limitations. More on that later.

For now, to put things in perspective, we turn to two experts at the University of Mississippi Medical Center: Dr. Gailen Marshall and Dr. Javed Butler.

“It’s not all that easy sometimes to know if you’re listening to a snake oil salesman,” said Marshall, professor of medicine and R. Faser Triplett Sr. M.D. Chair of Allergy and Immunology.

“The old adage, ‘If it’s too good to be true, it probably is,’ is a good one,” said Marshall, who is also the editor of the national journal “The Annals of Allergy, Asthma and Immunology.”

Butler, professor and chair of the Department of Medicine and former director of heart failure research at Emory University, chimed in with this caution: “My pet peeve with scientists is that they may fall in love with their hypotheses and exaggerate the first positive finding.”

And in this media-obsessed era, Butler said, “findings are sometimes given too much air, and the public goes crazy.”

“The trouble with many news reports is that they focus on the ‘highlights’ of a study,” Marshall added. “But details matter.”

A study reporting on the reporting of studies – yes, there is such a thing – isolates another weakness in many media accounts of possible breakthroughs: “[W]e found that newspapers were more likely to cover observational studies and less likely to cover randomized trials than high-impact journals.”

If you’re asking, ‘So what?’ – well, it may help to know the difference between observational studies and randomized trials.

In observational studies, researchers monitor certain individuals or measure certain outcomes – without trying to affect the outcomes. No one receives a treatment, for instance. These studies are useful when researchers can’t dive into a certain question any other way, such as: Are certain people more susceptible to heart disease?

To answer that question, it would be wrong to try to give someone a heart attack.

On the other hand, if you find, through observation, that people who read Stephen King novels are more likely to have heart attacks, you could be ignoring what researchers call potential “confounding biases” – for instance, the possibility that people who read Stephen King also eat more bacon.

By comparison, randomized trials are what many scientists refer to as the “gold standard” of research. Because people are randomly assigned to one of two or more study groups – one receives a new treatment, say, and the other doesn’t – hidden differences between the groups tend to even out, leaving far less room for the kind of bias described above.

Still, not all questions can be answered this way. Also, these trials can take many years – and dollars – to complete.
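For readers who like to see the difference in action, here is a minimal, hypothetical sketch in Python – our own illustration, not part of any study or expert comment quoted here – in which a hidden “bacon” habit makes an unrelated “Stephen King” habit look dangerous in an observational comparison, while random assignment makes the spurious link disappear.

# Hypothetical illustration of confounding vs. randomization.
# Heart attacks here depend only on bacon, never on the novels.
import random

random.seed(1)
N = 100_000

def simulate(randomize):
    """Return heart-attack rates for 'readers' vs. 'non-readers'."""
    attacks = {True: 0, False: 0}
    counts = {True: 0, False: 0}
    for _ in range(N):
        eats_bacon = random.random() < 0.5              # the hidden confounder
        if randomize:
            reads_king = random.random() < 0.5          # assigned by coin flip
        else:
            # Observational world: bacon eaters also happen to read more King
            reads_king = random.random() < (0.8 if eats_bacon else 0.2)
        attack = random.random() < (0.10 if eats_bacon else 0.02)
        counts[reads_king] += 1
        attacks[reads_king] += attack
    return {k: round(attacks[k] / counts[k], 3) for k in counts}

print("Observational:", simulate(randomize=False))      # readers look riskier
print("Randomized:   ", simulate(randomize=True))       # difference vanishes

Run it and the observational comparison shows “readers” having far more heart attacks than “non-readers,” even though the novels play no role at all; once reading is assigned by coin flip, the two rates are essentially identical.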

Of course, you could cut out the middle man – the media – and go online to scan the scholarly research journals for the original findings. But you’re liable to run across such actual sentences as this: “[S]mall enhancements in visual acuity and large-letter contrast sensitivity and a slightly larger improvement in small-letter contrast sensitivity were noted after consumption of dark compared with milk chocolate.”

While all that may be part of life’s rich pageant, what does it mean?

Beyond that, another minefield lurks: possible conflicts of interest, which can color a researcher’s conclusions, even unintentionally. For instance, if the alcoholic beverage industry funds a study showing that moderate drinking can be good for your health, you might want to take your next margarita with a grain elevator of salt.

“I may be naïve,” Butler said, “but I don’t believe that people actually blatantly lie. But if something seems to maybe work a little in their research, they, as human beings, might exaggerate how effective it is or how much benefit is there, because of some conflicts.”

Unfortunately, this coin has two sides. The other one is the “negatively conflicted” person – someone who may inflate a study’s flaws because of his or her own allegiances.

Obviously, then, there is no foolproof way to know all the facts and qualifiers in research. In Butler’s opinion, it’s especially difficult to be reasonably sure that a conclusion is justified unless you speak to someone who is an expert in the field. But, unless they know you, experts may not answer the phone.

So that leaves us with some general guidelines and warnings.

First of all, Marshall said, look for these red-flag statements:

• “This finding has never been reported before.”

• “This is the definitive answer.”

• “I’m the only one who is doing this.”

All of these claims can be true, Marshall said, but realize this: Almost all knowledge in research is based on something that came before and could be turned on its head by something that is yet to be.

“If the conclusion of a report says, ‘This is what the research shows and this is the next step,’ that is more valid,” Butler said. “Science is never finished.”

An additional red flag: “Any claim that says, ‘This will help everyone,’” Marshall said. “About the only things everyone participates in are death and taxes.”

Lifestyle choices and environments come in many flavors. Not everyone lives in a mansion; not everyone drinks beer for breakfast. You can’t account for everyone.

This brings up a related issue: Drug commercials on TV.

“Pharmaceutical companies assemble focus groups to decide what a product will be named, its color, its shape – all to maximize its appeal to patients,” Marshall said. “That means minimizing the drawbacks.”

And the “less fatal” a disease is, the more a drug will be hyped. Because a drug that saves your life pretty much sells itself.

As for those that may help you live longer, even then you should ask these questions: How much longer? And at what cost?

“Is it worth hundreds of thousands of dollars to live just a few months longer?” Marshall asked. Of course, that’s up to you.

Here’s another gut-puncher: “Placebos work around 30 percent of the time in clinical studies,” Marshall said. Look at that number again: Nearly one in three patients shows improvement on a treatment they only think they’re getting.

“To be fair to the scientists and companies reporting these claims, the audience doesn’t always listen, and doesn’t ask questions,” Marshall said, “especially if they are desperate for a solution. They’ve seen too many Hollywood movies where a patient is given something and then just jumps up off the table. This becomes their reality.”

Speaking of reality: that dark chocolate study? To get the health benefits described, you’d have to eat the equivalent of about seven average-sized candy bars daily. Goodbye, amnesia; hello, diabetes.

Orange juice vs. dementia? That’s only for healthy, middle-aged adults whose Tropicana Pure Premium OJ must be laced with pulp – a drink not currently available to the public. Also note: One 10-ounce serving is swimming with 28 grams of sugar; that’s seven teaspoons. Maybe you should just eat an orange.

Finally, that sleep study: Sleeping too long is indeed linked to heart disease, but the study describes only a “moderate degree of harm” compared with not sleeping long enough – and a link, remember, is not the same as a cause.

But let’s wind this up on a hopeful note.

“The majority of people bringing new discoveries to medicine and science are trying to do good things,” Marshall said. “If we’re skeptical, we’re more likely to discover the less useful things. Skeptical, not cynical.

“Cynicism says, ‘There’s no truth in it.’ Skepticism says, ‘Prove it to me.’”